Exponential Hebbian On-Line Learning Implemented in FPGAs

Authors

  • Markus L. Rossmann
  • T. Jost
  • Karl Goser
  • Andreas Bühlmeier
  • Gerhard Manteuffel
Abstract

Hebbian learning is a local learning algorithm and allows on-line adaptation of the weights. An artificial neural network with built-in Hebbian learning is therefore capable of learning during operation. This paper presents the implementation of this algorithm in a digital Field Programmable Gate Array (FPGA). Nonlinearity is introduced by applying nonlinear low-pass filtering to all input signals and by using exponentially shaped weight adaptation with different time constants for rising and falling. By employing a completely serial design for the data flow, a total of eight synapses can be implemented in a single FPGA device. The neuron comprises four conventional synapses with fixed weights and four Hebbian synapses for exponential on-line learning. Experiments show the improved performance of this system compared with a linear solution.
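
The abstract only outlines the learning rule, so the following Python fragment is a minimal behavioural sketch of how exponentially shaped weight adaptation with separate rise and fall time constants, driven by nonlinearly low-pass-filtered inputs, could be modelled in software. The class and function names, the tanh nonlinearity, the coincidence condition and all numeric constants are illustrative assumptions, not taken from the paper; the actual design is a serial digital FPGA circuit, not this floating-point model.

    import math

    class ExpHebbianSynapse:
        """One Hebbian synapse with exponentially shaped weight adaptation:
        a fast time constant while pre- and post-synaptic activity coincide,
        a slow one otherwise."""

        def __init__(self, w_max=1.0, tau_rise=10.0, tau_fall=200.0):
            self.w = 0.0               # current weight
            self.w_max = w_max         # saturation level for potentiation
            self.tau_rise = tau_rise   # fast time constant (learning)
            self.tau_fall = tau_fall   # slow time constant (forgetting)

        def update(self, pre, post, dt=1.0):
            if pre * post > 0.5:       # assumed coincidence condition
                # exponential approach towards w_max
                self.w += (self.w_max - self.w) * (dt / self.tau_rise)
            else:
                # exponential decay towards zero
                self.w -= self.w * (dt / self.tau_fall)
            return self.w

    def nonlinear_lowpass(x, state, tau=20.0, dt=1.0):
        """First-order low-pass filter with a saturating (tanh) input
        nonlinearity -- one plausible reading of 'nonlinear low-pass
        filtering' of the input signals."""
        return state + (math.tanh(x) - state) * (dt / tau)

    # Toy pairing experiment with the hypothetical parameters above.
    syn, lp = ExpHebbianSynapse(), 0.0
    for _ in range(120):               # pairing: input and neuron both active
        lp = nonlinear_lowpass(1.0, lp)
        syn.update(lp, 1.0)
    print(round(syn.w, 3))             # ~1.0: weight rose quickly (tau_rise)
    for _ in range(120):               # silence: weight decays much more slowly
        lp = nonlinear_lowpass(0.0, lp)
        syn.update(lp, 0.0)
    print(round(syn.w, 3))             # ~0.55: still well above zero (tau_fall)

Because tau_fall is much larger than tau_rise in this sketch, a weight acquired during a pairing phase persists long after the inputs fall silent, while acquisition itself is fast.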

Similar articles

A Hardware Implementation of a Network of Functional Spiking Neurons with Hebbian Learning

Nowadays, networks of artificial spiking neurons may contain thousands of synapses. Although software solutions offer flexibility, their performance decreases as the number of neurons and synapses grows. Embedded systems very often require real-time execution, which does not allow unconstrained growth of the execution time. On the other hand, hardware solutions, given their inher...

On-Line Hebbian Learning for Spiking Neurons: Architecture of the Weight-Unit of NESPINN

We present the implementation of on-line Hebbian learning for NESPINN, the neurocomputer for the simulation of spiking neurons. In order to support various forms of Hebbian learning, we developed a programmable weight unit for the NESPINN system. On-line weight modifications are performed in an event-controlled manner, in parallel with the computation of the basic neuron functions. According to our VHDL simulations...

Differential Power Analysis: A Serious Threat to FPGA Security

Differential Power Analysis (DPA) involves measuring the supply current of a cipher circuit in an attempt to uncover part of a cipher key. Cryptographic security is compromised if the measured current waveforms correlate with those from a hypothetical power model of the circuit. As FPGAs are becoming integral parts of embedded systems and increasingly popular for cryptographic applications and...

On-Line Learning with Restricted Training Sets: An Exactly Solvable Case

We solve the dynamics of on-line Hebbian learning in large perceptrons exactly, for the regime where the size of the training set scales linearly with the number of inputs. We consider both noiseless and noisy teachers. Our calculation cannot be extended to non-Hebbian rules, but the solution provides a convenient and welcome benchmark with which to test more general and advanced theories for s...
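
For orientation, the on-line Hebbian rule analysed in this and the following paper is conventionally written, in standard student/teacher perceptron notation (the exact normalisation used in the papers may differ), as

    \boldsymbol{J}(t+1) = \boldsymbol{J}(t) + \frac{\eta}{N}\,\boldsymbol{\xi}^{\mu(t)}\,T\!\left(\boldsymbol{\xi}^{\mu(t)}\right),
    \qquad T(\boldsymbol{\xi}) = \operatorname{sgn}\!\left(\boldsymbol{B}\cdot\boldsymbol{\xi}\right) \text{ for a noiseless teacher,}

where at each step an example \boldsymbol{\xi}^{\mu(t)} is drawn from a fixed training set of p = \alpha N patterns, which is the "restricted training set" regime in which the dynamics are solved.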

On-Line Learning with Restricted Training Sets: Exact Solution as Benchmark for General Theories

We solve the dynamics of on-line Hebbian learning in perceptrons exactly, for the regime where the size of the training set scales linearly with the number of inputs. We consider both noiseless and noisy teachers. Our calculation cannot be extended to non-Hebbian rules, but the solution provides a nice benchmark to test more general and advanced theories for solving the dynamics of learning wit...

Journal:

Volume   Issue

Pages  -

Publication date: 1996